105 research outputs found

    Weighted Fractional and Integral k-Matching in Hypergraphs

    We consider the problem of finding polynomial-time approximations of maximal weighted k-matchings in a hypergraph and investigate the relationship between the integral and fractional maxima of the corresponding 0-1 integer linear program and its LP-relaxation. We extend results of Raghavan, who gave a deterministic approximation algorithm for unweighted k-matching, to the weighted case and compare the lower bound thus obtained for the ratio of the integral and fractional maxima with a lower bound of Aharoni, Erdős and Linial.
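    To make the integral-versus-fractional comparison concrete, the following sketch (not from the paper; scipy, the toy hypergraph, the weights and the value of k are illustrative assumptions) sets up the 0-1 program for weighted k-matching together with its LP-relaxation and prints both maxima.

```python
# Sketch: weighted k-matching ILP vs. LP-relaxation on a toy hypergraph.
# Assumptions: scipy >= 1.9 (for milp) is available; instance data are illustrative.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds

# Toy hypergraph: vertices 0..4, hyperedges given as vertex tuples, with weights.
edges = [(0, 1, 2), (1, 2, 3), (0, 3), (2, 4), (3, 4)]
weights = np.array([3.0, 2.0, 1.5, 2.5, 1.0])
n_vertices, k = 5, 1                     # each vertex may be covered at most k times

# Vertex-edge incidence matrix A: A[v, e] = 1 if vertex v lies in hyperedge e.
A = np.zeros((n_vertices, len(edges)))
for e, verts in enumerate(edges):
    for v in verts:
        A[v, e] = 1.0

# Maximize w^T x  subject to  A x <= k,  0 <= x <= 1  (x integral for the ILP).
constraints = LinearConstraint(A, ub=np.full(n_vertices, k))
bounds = Bounds(0, 1)

lp = linprog(-weights, A_ub=A, b_ub=np.full(n_vertices, k), bounds=(0, 1))
ilp = milp(-weights, constraints=constraints, bounds=bounds,
           integrality=np.ones(len(edges)))

print("fractional maximum:", -lp.fun)
print("integral maximum:  ", -ilp.fun)
print("ratio (integral / fractional):", -ilp.fun / -lp.fun)
```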

    On Derandomized Approximation Algorithms

    With the design of powerful randomized algorithms, the transformation of a randomized algorithm or probabilistic existence result for combinatorial problems into an efficient deterministic algorithm (so-called derandomization) has become an important issue in algorithmic discrete mathematics. In recent years several interesting examples of derandomization have been published, such as discrepancy in hypergraph colouring, packing integer programs and an algorithmic version of the Lovász Local Lemma. In this paper the derandomization method of conditional probabilities of Raghavan and Spencer is extended using discrete martingales. As a main result, pessimistic estimators are constructed for combinatorial approximation problems involving non-linear objective functions with bounded martingale differences. The theory gives polynomial-time algorithms for the linear and quadratic lattice approximation problem and a quadratic variant of the matrix balancing problem, extending results of Spencer, Beck/Fiala and Raghavan. Finally, a probabilistic existence result of Erdős on the average graph bisection is transformed into a deterministic algorithm.
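    As a generic illustration of the method of conditional probabilities extended in this paper (the textbook MAX-CUT example, not the paper's martingale-based construction), the sketch below fixes the vertices one by one, each time choosing the side that does not decrease the conditional expectation of the cut size.

```python
# Sketch: derandomization by conditional expectations for MAX-CUT.
# Not the paper's construction; a textbook illustration of the method.

def derandomized_max_cut(n, edges):
    """Assign each vertex to side 0/1 so that at least half the edges are cut.

    At every step the conditional expectation of the cut size (with the
    remaining vertices placed uniformly at random) does not decrease.
    """
    side = {}
    for v in range(n):
        best_side, best_exp = None, -1.0
        for s in (0, 1):
            side[v] = s
            exp_cut = 0.0
            for a, b in edges:
                if a in side and b in side:
                    exp_cut += 1.0 if side[a] != side[b] else 0.0
                else:
                    exp_cut += 0.5   # an undecided edge is cut with probability 1/2
            if exp_cut > best_exp:
                best_side, best_exp = s, exp_cut
        side[v] = best_side
    return side

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
assignment = derandomized_max_cut(4, edges)
cut = sum(assignment[a] != assignment[b] for a, b in edges)
print(assignment, "cut size:", cut, "of", len(edges), "edges")
```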

    Algorithmic Chernoff-Hoeffding Inequalities in Integer Programming

    Proofs of the classical Chernoff-Hoeffding bounds have been used to obtain polynomial-time implementations of Spencer's derandomization method of conditional probabilities on usual finite machine models: given m events whose complements are large deviations corresponding to weighted sums of n mutually independent Bernoulli trials, Raghavan's lattice approximation algorithm constructs, for 0-1 weights and integer deviation terms, in O(mn) time a point for which all events hold. For rational weighted sums of Bernoulli trials the lattice approximation algorithm and Spencer's hyperbolic cosine algorithm are deterministic procedures, but a polynomial-time implementation was not known. We resolve this problem with an O(mn^2 log(mn/ε))-time algorithm, whenever the probability that all events hold is at least ε > 0. Since such algorithms simulate the proof of the underlying large deviation inequality in a constructive way, we call such an algorithm the algorithmic version of the inequality. Applications to general packing integer programs and resource-constrained scheduling result in tight polynomial-time approximation algorithms.
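    The flavour of such a pessimistic estimator can be sketched as follows: a simplified, textbook-style bit-fixing routine for lattice approximation whose potential is built from the exponential moments of the Chernoff-Hoeffding proof. The instance and the parameter lam are illustrative assumptions; this is not the paper's O(mn^2 log(mn/ε)) implementation.

```python
# Sketch: greedy bit-fixing guided by a Chernoff-type pessimistic estimator
# (textbook Raghavan/Spencer style; instance and lambda are illustrative).
import numpy as np

def lattice_round(A, p, lam=1.0):
    """Round p in [0,1]^n to q in {0,1}^n so that all row deviations A(q - p) stay small.

    The estimator sums, over rows and both deviation signs, the exponential
    moments from the Chernoff-Hoeffding proof; fixing each bit to the value
    that minimizes the estimator guarantees that it never increases.
    """
    m, n = A.shape
    q = p.astype(float).copy()
    fixed = np.zeros(n, dtype=bool)

    def estimator():
        total = 0.0
        for sign in (+1.0, -1.0):
            # Fixed coordinates contribute their realized factor, free ones the
            # expectation of that factor under q_j ~ Bernoulli(p_j).
            fac_fixed = np.exp(sign * lam * A * (q - p))
            fac_free = (p * np.exp(sign * lam * A * (1 - p))
                        + (1 - p) * np.exp(-sign * lam * A * p))
            factors = np.where(fixed, fac_fixed, fac_free)
            total += 0.5 * factors.prod(axis=1).sum()
        return total

    for j in range(n):
        fixed[j] = True
        scores = []
        for b in (0.0, 1.0):
            q[j] = b
            scores.append((estimator(), b))
        q[j] = min(scores)[1]
    return q.astype(int)

rng = np.random.default_rng(0)
A = rng.uniform(size=(4, 8))       # m = 4 weighted sums of n = 8 Bernoulli trials
p = rng.uniform(size=8)
q = lattice_round(A, p)
print("q =", q)
print("row deviations |A(q - p)| =", np.round(np.abs(A @ (q - p)), 3))
```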

    A Streaming Algorithm for the Undirected Longest Path Problem

    We present the first streaming algorithm for the longest path problem in undirected graphs. The input graph is given as a stream of edges, and RAM is limited to only a linear number of edges at a time (linear in the number of vertices n). We prove a per-edge processing time of O(n), where a naive solution would require Ω(n^2). Moreover, we give a concrete linear upper bound on the number of bits of RAM that are required. On a set of graphs with various structures, we experimentally compare our algorithm with three leading RAM algorithms: Warnsdorf (1823), Pohl-Warnsdorf (1967), and Pongrasz (2012). Although it conducts only a small constant number of passes over the input, our algorithm delivers competitive results: with the exception of preferential attachment graphs, we deliver at least 71% of the solution of the best RAM algorithm. The same minimum relative performance of 71% is observed over all graph classes after removing the 10% worst cases. This comparison has strong meaning, since for each instance class there is one algorithm that on average delivers at least 84% of a Hamilton path. In some cases we deliver even better results than any of the RAM algorithms.
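    The abstract does not spell out the algorithm itself; purely as an illustration of the streaming model it describes (edges arrive one at a time, memory linear in n), the toy sketch below maintains vertex-disjoint paths and joins them at free endpoints. It is not the paper's algorithm and carries no approximation guarantee.

```python
# Toy sketch of edge streaming with O(n) memory: grow vertex-disjoint paths
# by joining them at free endpoints.  Illustrative only, not the paper's algorithm.
class PathBuilder:
    def __init__(self, n):
        self.adj = {v: [] for v in range(n)}   # kept edges, at most 2 per vertex
        self.parent = list(range(n))           # union-find over path components

    def _find(self, v):
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]   # path halving
            v = self.parent[v]
        return v

    def process_edge(self, u, v):
        # Accept the edge only if both endpoints are free (degree < 2 in the
        # kept subgraph) and it does not close a cycle within one path.
        if len(self.adj[u]) < 2 and len(self.adj[v]) < 2:
            ru, rv = self._find(u), self._find(v)
            if ru != rv:
                self.adj[u].append(v)
                self.adj[v].append(u)
                self.parent[ru] = rv

    def longest_path(self):
        # Walk each maximal path from one of its endpoints; return the longest.
        best = []
        for start, nbrs in self.adj.items():
            if len(nbrs) == 1:                 # path endpoint
                path, prev, cur = [start], None, start
                while True:
                    nxt = [w for w in self.adj[cur] if w != prev]
                    if not nxt:
                        break
                    prev, cur = cur, nxt[0]
                    path.append(cur)
                if len(path) > len(best):
                    best = path
        return best

edge_stream = [(0, 1), (1, 2), (2, 0), (2, 3), (4, 5), (3, 4), (5, 0)]
pb = PathBuilder(6)
for u, v in edge_stream:
    pb.process_edge(u, v)
print("path found:", pb.longest_path())
```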

    Automatic cloud classification of whole sky images

    The recent increase in the development of whole-sky imagers enables sky observations with high temporal and spatial resolution. One application already performed in most cases is the estimation of fractional sky cover. A distinction between different cloud types, however, is still in progress. Here, an automatic cloud classification algorithm is presented, based on a set of mainly statistical features describing the color as well as the texture of an image. The k-nearest-neighbour classifier is used due to its high performance in solving complex problems, its simplicity of implementation and its low computational complexity. Seven different sky conditions are distinguished: high thin clouds (cirrus and cirrostratus), high patched cumuliform clouds (cirrocumulus and altocumulus), stratocumulus clouds, low cumuliform clouds, thick clouds (cumulonimbus and nimbostratus), stratiform clouds and clear sky. Based on leave-one-out cross-validation, the algorithm achieves an accuracy of about 97%. In addition, a test run on random images is presented, still outperforming previous algorithms by yielding a success rate of about 75%, or up to 88% if only "serious" errors with respect to radiation impact are considered. Reasons for the decrease in accuracy are discussed, and ideas to further improve the classification results, especially in problematic cases, are investigated.
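    The classification pipeline can be sketched with scikit-learn as below; the feature matrix is a random placeholder for the color and texture statistics extracted from real sky images, and the choice of k = 3 neighbours is an illustrative assumption.

```python
# Sketch: k-nearest-neighbour cloud-type classification with leave-one-out
# cross-validation.  Features and labels are random placeholders, not real data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

CLASSES = ["cirrus/cirrostratus", "cirrocumulus/altocumulus", "stratocumulus",
           "low cumuliform", "cumulonimbus/nimbostratus", "stratiform", "clear sky"]

rng = np.random.default_rng(42)
n_images, n_features = 70, 12            # e.g. color statistics plus texture measures
X = rng.normal(size=(n_images, n_features))
y = rng.integers(0, len(CLASSES), size=n_images)

clf = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.3f}")
```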

    The Price of Anarchy in Selfish Multicast Routing

    We study the price of anarchy for selfish multicast routing games in directed multigraphs with latency functions on the edges, extending the known theory for the unicast situation, and exhibiting new phenomena not present in the unicast model...
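    As a reminder of the quantity being bounded (a standard unicast Pigou-style example, not taken from the paper, which treats multicast), the price of anarchy compares the total latency at a selfish equilibrium with the social optimum:

```python
# Sketch: price of anarchy in the classic two-link Pigou network (unicast).
# One unit of traffic; latencies l1(x) = 1 (constant) and l2(x) = x.
from scipy.optimize import minimize_scalar

def total_cost(x2):
    x1 = 1.0 - x2
    return x1 * 1.0 + x2 * x2      # flow times latency on each link

# Selfish equilibrium: all traffic takes the second link (latency 1 <= 1).
equilibrium_cost = total_cost(1.0)

# Social optimum: minimize the total cost over the traffic split.
opt = minimize_scalar(total_cost, bounds=(0.0, 1.0), method="bounded")

print("equilibrium cost:", equilibrium_cost)
print("optimal cost:    ", round(opt.fun, 4))
print("price of anarchy:", round(equilibrium_cost / opt.fun, 4))   # 4/3
```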

    Randomized Algorithms for Mixed Matching and Covering in Hypergraphs in 3D Seed Reconstruction in Brachytherapy

    Brachytherapy is a method developed in the 1980s for the radiation treatment of cancer in organs such as the prostate, lung, or breast. At the Clinic of Radiotherapy (radiooncology) of Christian Albrechts University of Kiel, among other institutions, low-dose radiation therapy (LDR therapy) for the treatment of prostate cancer is applied, in which 25-80 small radioactive seeds are implanted in the affected organ. For quality control of the treatment plan, the locations of the seeds have to be checked after the operation. This is usually done by taking three X-ray photographs from three different angles (the so-called 3-film technique). On the films the seeds appear as white lines. To determine the positions of the seeds in the organ, the task is to match, across the three images, the lines that represent the same seed. In this paper we first model the seed reconstruction problem as a minimum-weight perfect matching problem in a hypergraph and, based on this, as an integer linear program. To solve this integer program, an algorithm based on the randomized rounding scheme introduced by Raghavan and Thompson (1987) is designed and applied. This algorithm is not only very fast, but also accessible, at least in part, to a mathematically rigorous analysis. We give a partial analysis of the algorithm, combining probabilistic and combinatorial methods, which shows that in the worst case the solution produced is, in a strong sense, close to a minimum-weight perfect matching.
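    The modelling step described above can be sketched as follows: a minimum-weight perfect matching over triples of lines (one per film) written as an integer program, solved via its LP-relaxation and a one-shot randomized rounding in the spirit of Raghavan and Thompson. The instance, the cost values and the omission of any repair step are illustrative simplifications.

```python
# Sketch: seed reconstruction as min-weight perfect matching in a 3-partite
# hypergraph, solved by LP-relaxation plus randomized rounding.
# Instance and costs are illustrative; no repair step is included.
import itertools
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n_seeds = 4                        # lines per film; one triple per true seed
films = 3

# cost[t] = how badly triple t (one line from each film) fits a single seed;
# random here, in practice derived from the reconstruction geometry.
triples = list(itertools.product(range(n_seeds), repeat=films))
cost = rng.uniform(1.0, 10.0, size=len(triples))

# Constraint: every line of every film is used by exactly one chosen triple.
A_eq = np.zeros((films * n_seeds, len(triples)))
for t, (i, j, k) in enumerate(triples):
    A_eq[0 * n_seeds + i, t] = 1
    A_eq[1 * n_seeds + j, t] = 1
    A_eq[2 * n_seeds + k, t] = 1
b_eq = np.ones(films * n_seeds)

lp = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
x = lp.x

# One-shot randomized rounding: keep triple t with probability x[t].
chosen = [t for t in range(len(triples)) if rng.random() < x[t]]
print("fractional optimum:", round(lp.fun, 3))
print("rounded triples:   ", [triples[t] for t in chosen])
print("rounded cost:      ", round(cost[chosen].sum(), 3))
```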